Empirical Evaluation of Four Tensor Decomposition Algorithms
Author
Abstract
Higher-order tensor decompositions are analogous to the familiar Singular Value Decomposition (SVD), but they transcend the limitations of matrices (second-order tensors). SVD is a powerful tool that has achieved impressive results in information retrieval, collaborative filtering, computational linguistics, computational vision, and other fields. However, SVD is limited to two-dimensional arrays of data (two modes), and many potential applications have three or more modes, which require higher-order tensor decompositions. This paper evaluates four algorithms for higher-order tensor decomposition: Higher-Order Singular Value Decomposition (HO-SVD), Higher-Order Orthogonal Iteration (HOOI), Slice Projection (SP), and Multislice Projection (MP). We measure the time (elapsed run time), space (RAM and disk space requirements), and fit (tensor reconstruction accuracy) of the four algorithms under a variety of conditions. We find that standard implementations of HO-SVD and HOOI do not scale up to larger tensors, due to increasing RAM requirements. We recommend HOOI for tensors that are small enough for the available RAM and MP for larger tensors.
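For readers who want a concrete picture of what two of the evaluated algorithms compute, the following is a minimal NumPy sketch of truncated HO-SVD and of HOOI initialized from it, reporting fit as 1 - ||X - X_hat|| / ||X||. It is not the implementation benchmarked in the paper; the helper names (unfold, mode_dot), the ranks, and the synthetic test tensor are illustrative assumptions.

```python
# Minimal sketch of truncated HO-SVD and HOOI for a third-order tensor.
# NOT the implementation evaluated in the paper; helper names, ranks, and
# the synthetic test tensor below are illustrative assumptions only.
import numpy as np


def unfold(T, mode):
    """Mode-n unfolding: put `mode` first, then flatten the remaining modes."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)


def mode_dot(T, M, mode):
    """Mode-n product T x_n M, where M has shape (J, T.shape[mode])."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)


def hosvd(T, ranks):
    """Truncated HO-SVD: one SVD per mode-n unfolding, then project to get the core."""
    factors = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
               for n, r in enumerate(ranks)]
    core = T
    for n, U in enumerate(factors):
        core = mode_dot(core, U.T, n)
    return core, factors


def hooi(T, ranks, n_iter=25):
    """HOOI: start from the HO-SVD factors, then refine one factor at a time."""
    _, factors = hosvd(T, ranks)
    for _ in range(n_iter):
        for n, r in enumerate(ranks):
            # Project T onto every factor except mode n, then re-estimate U_n
            # as the leading left singular vectors of that projection's unfolding.
            Y = T
            for m, U in enumerate(factors):
                if m != n:
                    Y = mode_dot(Y, U.T, m)
            factors[n] = np.linalg.svd(unfold(Y, n), full_matrices=False)[0][:, :r]
    core = T
    for n, U in enumerate(factors):
        core = mode_dot(core, U.T, n)
    return core, factors


def fit(T, core, factors):
    """Reconstruction accuracy, 1 - ||T - T_hat|| / ||T|| (1.0 is a perfect fit)."""
    T_hat = core
    for n, U in enumerate(factors):
        T_hat = mode_dot(T_hat, U, n)
    return 1.0 - np.linalg.norm(T - T_hat) / np.linalg.norm(T)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Build a rank-(5, 5, 5) tensor plus a little noise so a high fit is achievable.
    X = rng.standard_normal((5, 5, 5))
    for n, size in enumerate((30, 40, 50)):
        X = mode_dot(X, rng.standard_normal((size, 5)), n)
    X += 0.01 * np.linalg.norm(X) / np.sqrt(X.size) * rng.standard_normal(X.shape)

    for name, alg in (("HO-SVD", hosvd), ("HOOI", hooi)):
        core, factors = alg(X, ranks=(5, 5, 5))
        print(f"{name} fit: {fit(X, core, factors):.4f}")
```

Because each HOOI sweep re-estimates one factor while holding the others fixed, its fit never drops below that of the single-pass HO-SVD it starts from, which is consistent with the recommendation of HOOI when the tensor fits in RAM.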
Similar papers
A simple form of MT impedance tensor analysis to simplify its decomposition to remove the effects of near surface small-scale 3-D conductivity structures
Magnetotelluric (MT) is a natural electromagnetic (EM) technique which is used for geothermal, petroleum, geotechnical, groundwater and mineral exploration. MT is also routinely used for mapping of deep subsurface structures. In this method, the measured regional complex impedance tensor (Z) is substantially distorted by any topographical feature or small-scale near-surface, three-dimensional (...
HOID: Higher Order Interpolatory Decomposition for tensors based on Tucker representation
We derive a CUR-type factorization for tensors in the Tucker format based on interpolatory decomposition, which we denote as the Higher Order Interpolatory Decomposition (HOID). Given a tensor X, the algorithm provides a set of matrices {C_n}, n = 1, ..., N, whose columns are extracted from the mode-n tensor unfolding, along with a core tensor G, and together they satisfy some error bounds (see the sketch after this list). Compared ...
Non-orthogonal tensor diagonalization
Tensor diagonalization means transforming a given tensor to an exactly or nearly diagonal form through multiplying the tensor by non-orthogonal invertible matrices along selected dimensions of the tensor. It has a link to an approximate joint diagonalization (AJD) of a set of matrices. In this paper, we derive (1) a new algorithm for a symmetric AJD, which is called two-sided symmetric diagonal...
Decomposition of Big Tensors With Low Multilinear Rank
Tensor decompositions are promising tools for big data analytics as they bring multiple modes and aspects of data to a unified framework, which allows us to discover complex internal structures and correlations of data. Unfortunately most existing approaches are not designed to meet the major challenges posed by big data analytics. This paper attempts to improve the scalability of tensor decomp...
Sublinear Time Orthogonal Tensor Decomposition
A recent work (Wang et al., NIPS 2015) gives the fastest known algorithms for orthogonal tensor decomposition with provable guarantees. Their algorithm is based on computing sketches of the input tensor, which requires reading the entire input. We show that, in a number of cases, one can achieve the same theoretical guarantees in sublinear time, i.e., even without reading most of the input tensor. In...
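The HOID entry above turns on keeping actual columns of each mode-n unfolding rather than singular vectors. As a loose illustration of that column-extraction idea only (not the HOID algorithm or its error bounds), the sketch below picks columns with column-pivoted QR and fits the core by least squares; it assumes SciPy is available, and the function names, ranks, and test tensor are invented for the example.

```python
# Generic sketch of a CUR/interpolatory-style Tucker factorization: keep actual
# columns of each mode-n unfolding and fit a core tensor to them.  This is NOT
# the HOID algorithm from the paper cited above; names and ranks are illustrative.
import numpy as np
from scipy.linalg import qr


def unfold(T, mode):
    """Mode-n unfolding: put `mode` first, then flatten the remaining modes."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)


def mode_dot(T, M, mode):
    """Mode-n product T x_n M, where M has shape (J, T.shape[mode])."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)


def interpolatory_tucker(T, ranks):
    """Keep r_n actual columns of each mode-n unfolding; fit the core by least squares."""
    column_sets = []
    for n, r in enumerate(ranks):
        Xn = unfold(T, n)
        # Column-pivoted QR as a simple greedy column-selection heuristic
        # (the HOID paper's selection strategy and bounds are not reproduced here).
        _, _, piv = qr(Xn, mode="economic", pivoting=True)
        column_sets.append(Xn[:, piv[:r]])
    core = T
    for n, C in enumerate(column_sets):
        core = mode_dot(core, np.linalg.pinv(C), n)  # least-squares core for fixed columns
    return core, column_sets


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # A tensor with multilinear rank (10, 10, 10), so the extracted columns can represent it.
    X = rng.standard_normal((10, 10, 10))
    for n, size in enumerate((30, 40, 50)):
        X = mode_dot(X, rng.standard_normal((size, 10)), n)

    core, cols = interpolatory_tucker(X, ranks=(10, 10, 10))
    X_hat = core
    for n, C in enumerate(cols):
        X_hat = mode_dot(X_hat, C, n)
    print("fit:", 1.0 - np.linalg.norm(X - X_hat) / np.linalg.norm(X))
```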
Journal: CoRR
Volume: abs/0711.2023
Issue: -
Pages: -
Publication date: 2007